Long short-term enhanced memory for sequential recommendation
Authors
Abstract
Sequential recommendation is a stream of research on recommender systems that focuses on predicting the next item a user will interact with by modeling the dynamic sequence of user-item interactions. Since they were designed to capture tendencies in variable-length temporal sequences, Recurrent Neural Networks (RNNs) have received much attention in this area. However, inherent defects caused by the network structure of RNNs limit their application to sequential recommendation, mainly in two respects: they tend to make point-wise predictions and ignore collective dependencies, because the relationships between items are modeled as changing monotonically; and they are likely to forget essential information when processing long sequences. To solve these problems, researchers have done much work to enhance the memory mechanism of RNNs. Although previous RNN-based methods have achieved promising performance by taking advantage of external knowledge and other advanced techniques, improving the intrinsic properties of the existing memory mechanism has not been explored and remains challenging. Therefore, in this work, we propose a novel architecture based on Long Short-Term Memories (LSTMs), a broadly used variant of RNNs, specific to sequential recommendation, called Long Short-Term enhanced Memory (LSTeM), which boosts the original LSTMs in two ways. First, we design new gates by introducing a "Q-K-V" triplet, which accurately and properly models the correlation between the current input and the user's historical behaviors at each time step. Second, we propose a "recover gate" to remedy the inadequacy of the forgetting mechanism, which works together with a global embedding. Extensive experiments demonstrate that LSTeM achieves performance comparable to state-of-the-art methods on challenging sequential recommendation datasets.
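The abstract does not give the paper's exact equations, but the "Q-K-V" gate idea can be illustrated with a minimal NumPy sketch: the current item embedding produces a query, the user's historical item embeddings produce keys and values, and the attention-weighted summary of the history drives a sigmoid gate in place of the plain linear term of a standard LSTM gate. All names and shapes here (`W_q`, `W_k`, `W_v`, `qkv_gate`) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def qkv_gate(x_t, history, W_q, W_k, W_v):
    """Gate activation computed by attending over the user's item
    history with a query derived from the current input (a sketch)."""
    q = W_q @ x_t                              # query from current item embedding (d,)
    K = history @ W_k.T                        # keys, one per historical item  (T, d)
    V = history @ W_v.T                        # values, one per historical item (T, d)
    scores = softmax(K @ q / np.sqrt(q.size))  # scaled dot-product weights (T,)
    context = scores @ V                       # attention-weighted history summary (d,)
    return sigmoid(context)                    # gate values in (0, 1), as in an LSTM gate

# Toy usage: one current item embedding and a 5-item history.
rng = np.random.default_rng(0)
d, T = 8, 5
W_q, W_k, W_v = (rng.standard_normal((d, d)) for _ in range(3))
x_t = rng.standard_normal(d)
history = rng.standard_normal((T, d))
gate = qkv_gate(x_t, history, W_q, W_k, W_v)   # shape (d,), entries strictly in (0, 1)
```

Such a gate lets the cell weigh the whole interaction history at each step, rather than relying only on the recurrent state, which is the point-wise limitation the abstract criticizes.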
Similar resources
The Effects of Keyword and Context Methods on Pronunciation and Receptive/Productive Vocabulary of Low-Intermediate Iranian EFL Learners: Short-Term and Long-Term Memory in Focus
From the past until now, much research has been conducted, all of which has in one way or another acknowledged the usefulness of vocabulary learning strategies in a foreign language. This study examines the effect of two different methods of teaching English vocabulary (keyword and context) on the pronunciation and vocabulary knowledge of low-intermediate Iranian EFL learners and on its retention in memory. For this purpose, sixty Iranian language learners aged eight to fourteen were...
Long Short-term Memory
Model compression is significant for the wide adoption of Recurrent Neural Networks (RNNs) in both user devices possessing limited resources and business clusters requiring quick responses to large-scale service requests. This work aims to learn structurally-sparse Long Short-Term Memory (LSTM) by reducing the sizes of basic structures within LSTM units, including input updates, gates, hidden s...
Long Short-Term Memory
Learning to store information over extended time intervals by recurrent backpropagation takes a very long time, mostly because of insufficient, decaying error backflow. We briefly review Hochreiter's (1991) analysis of this problem, then address it by introducing a novel, efficient, gradient-based method called long short-term memory (LSTM). Truncating the gradient where this does not do harm, ...
Sequential dynamics in visual short-term memory.
Visual short-term memory (VSTM) is thought to help bridge across changes in visual input, and yet many studies of VSTM employ static displays. Here we investigate how VSTM copes with sequential input. In particular, we characterize the temporal dynamics of several different components of VSTM performance, including: storage probability, precision, variability in precision, guessing, and swappin...
Grid Long Short-Term Memory
This paper introduces Grid Long Short-Term Memory, a network of LSTM cells arranged in a multidimensional grid that can be applied to vectors, sequences or higher dimensional data such as images. The network differs from existing deep LSTM architectures in that the cells are connected between network layers as well as along the spatiotemporal dimensions of the data. The network provides a unifi...
Journal
Journal title: World Wide Web
Year: 2022
ISSN: ['1573-1413', '1386-145X']
DOI: https://doi.org/10.1007/s11280-022-01056-9